Approximation by Fully Complex MLP Using Elementary Transcendental Activation Functions

Authors

  • Taehwan Kim
  • Tülay Adali
Abstract

Recently, we have presented 'fully' complex multi-layer perceptrons (MLPs) using a subset of complex elementary transcendental functions as the nonlinear activation functions. These functions jointly process the in-phase (I) and quadrature (Q) components of data while taking full advantage of well-defined gradients in the error back-propagation. In this paper, the characteristics of these elementary transcendental functions are categorized and their common almost everywhere (a.e.) bounded and analytic properties are investigated. More importantly, it is proved that fully complex MLPs are a.e. convergent and therefore capable of universally approximating any nonlinear complex mapping to arbitrary accuracy. Numerical examples demonstrate the benefit of the isolated essential singularities included in a subgroup of the elementary transcendental functions for achieving arbitrarily close approximation to the desired mapping.
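As a concrete illustration of the architecture the abstract describes, the following is a minimal sketch of a single-hidden-layer fully complex MLP forward pass in Python/NumPy, using the complex hyperbolic tangent as the elementary transcendental activation. The layer sizes, weight initialization, and choice of tanh are illustrative assumptions, not the authors' exact experimental setup.

    import numpy as np

    def fully_complex_mlp(z, W1, b1, W2, b2):
        # Fully complex hidden layer: the activation tanh is applied to
        # the complex pre-activation as a whole, so the in-phase (real)
        # and quadrature (imaginary) parts are processed jointly rather
        # than split into two independent real channels.
        h = np.tanh(W1 @ z + b1)   # complex tanh: analytic a.e. in C
        return W2 @ h + b2         # linear complex output layer

    # Illustrative usage with random complex weights (sizes are assumptions)
    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 4, 8, 2
    W1 = rng.normal(size=(n_hid, n_in)) + 1j * rng.normal(size=(n_hid, n_in))
    b1 = rng.normal(size=n_hid) + 1j * rng.normal(size=n_hid)
    W2 = rng.normal(size=(n_out, n_hid)) + 1j * rng.normal(size=(n_out, n_hid))
    b2 = rng.normal(size=n_out) + 1j * rng.normal(size=n_out)
    z = rng.normal(size=n_in) + 1j * rng.normal(size=n_in)
    y = fully_complex_mlp(z, W1, b1, W2, b2)   # complex output vector

Because tanh is analytic away from its isolated poles, its complex derivative is well defined a.e., which is what makes the fully complex back-propagation gradients referenced above usable.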


Related articles

Approximation by Fully Complex Multilayer Perceptrons

We investigate the approximation ability of a multilayer perceptron (MLP) network when it is extended to the complex domain. The main challenge in processing complex data with neural networks has been the lack of complex nonlinear activation functions that are both bounded and analytic on the entire complex domain, a consequence of Liouville's theorem. To avoid the conflict between boundedness and analyticity ...
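For context, the theorem behind this conflict can be stated formally as follows (this statement is added here for clarity and is not part of the original abstract):

    \textbf{Liouville's theorem.}\quad
    f \text{ entire and } \exists M \ge 0 \text{ such that } |f(z)| \le M
    \ \forall z \in \mathbb{C}
    \;\Longrightarrow\; f \text{ is constant.}

Consequently, any nonconstant activation that is analytic on all of C, such as sin z or e^z, must be unbounded; the fully complex approach therefore works with functions that are bounded and analytic almost everywhere instead.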


Uniform Approximations for Transcendental Functions

A heuristic method to construct uniform approximations to analytic transcendental functions is developed as a generalization of the Hermite-Padé interpolation to infinite intervals. The resulting uniform approximants are built from elementary functions using known series and asymptotic expansions of the given transcendental function. In one case (Lambert’s W function) we obtained a uniform appr...
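To illustrate the general idea of building elementary approximants from known asymptotic data (this snippet is an assumption-laden illustration, not the paper's actual construction), here is a crude two-term asymptotic approximation to the Lambert W function compared against a reference implementation:

    import numpy as np
    from scipy.special import lambertw

    def w_asymptotic(x):
        # Two-term asymptotic expansion of the Lambert W function,
        # W(x) ~ ln x - ln ln x + (ln ln x)/ln x, accurate for large x.
        L1 = np.log(x)
        L2 = np.log(L1)
        return L1 - L2 + L2 / L1

    x = np.array([10.0, 100.0, 1e4, 1e8])
    print(w_asymptotic(x))      # elementary approximation
    print(lambertw(x).real)     # reference values

A uniform approximant in the sense of the paper would stitch such asymptotic behavior together with the series expansion near the origin so that a single elementary expression remains accurate over the whole interval.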


Complex backpropagation neural network using elementary transcendental activation functions

Designing a neural network (NN) for processing complex signals is a challenging task due to the lack of bounded and differentiable nonlinear activation functions on the entire complex domain C. To avoid this difficulty, 'splitting', i.e., using uncoupled real sigmoidal functions for the real and imaginary components, has been the traditional approach, and a number of fully complex activation fun...
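To make the contrast between the two strategies concrete, here is a minimal sketch (an illustration under assumed function choices, not code from the paper) of a split activation versus a fully complex one:

    import numpy as np

    def split_activation(z):
        # 'Splitting': apply a real sigmoid independently to the
        # in-phase (real) and quadrature (imaginary) components.
        sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
        return sigmoid(z.real) + 1j * sigmoid(z.imag)

    def fully_complex_activation(z):
        # Fully complex: a single elementary transcendental function
        # of the complex variable, processing I and Q jointly.
        return np.tanh(z)

    z = np.array([0.3 + 0.8j, -1.2 + 0.1j])
    print(split_activation(z))
    print(fully_complex_activation(z))

The split form is bounded but not complex-analytic, so it discards the phase coupling between I and Q; the fully complex form preserves that coupling at the cost of isolated singularities.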


Lower bounds for approximation by MLP neural networks

The degree of approximation by a single hidden layer MLP model with n units in the hidden layer is bounded below by the degree of approximation by a linear combination of n ridge functions. We prove that there exists an analytic, strictly monotone, sigmoidal activation function for which this lower bound is essentially attained. We also prove, using this same activation function, that...
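For context (this formalization follows standard usage and is added here, not quoted from the paper), a ridge-function class and the lower bound in question can be written as:

    \mathcal{R}_n = \Big\{ \textstyle\sum_{i=1}^{n} g_i(\mathbf{a}_i \cdot \mathbf{x})
        : g_i : \mathbb{R} \to \mathbb{R},\ \mathbf{a}_i \in \mathbb{R}^d \Big\},
    \qquad
    \operatorname{dist}(f, \mathcal{M}_n) \;\ge\; \operatorname{dist}(f, \mathcal{R}_n),

where \mathcal{M}_n denotes the outputs of a single-hidden-layer MLP with n hidden units. The inequality holds because each hidden unit computes \sigma(\mathbf{a}_i \cdot \mathbf{x} + b_i), so the MLP class is contained in \mathcal{R}_n.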


A Logarithmic Neural Network Architecture for PRA Approximation

A neural network based risk monitor was designed to emulate the results of a nuclear power plant probabilistic risk assessment (PRA). Although multilayer feedforward neural networks with sigmoidal activation functions have been termed universal function approximators, this approximation may require an inordinate number of hidden nodes and is only accurate over a finite interval. These shortcomings are...
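As a rough sketch of what a logarithmic architecture can look like (an assumption based on the common log/exp construction, not necessarily this paper's exact design), each hidden unit forms a weighted sum of log-transformed inputs and exponentiates the result, yielding product-of-powers terms that can extrapolate beyond a finite training interval:

    import numpy as np

    def log_nn_forward(x, W, v):
        # Hidden units: exp(W @ ln x) == products of powers of the inputs,
        # e.g. a unit with weights (2, -1) computes x1**2 / x2.
        h = np.exp(W @ np.log(x))   # inputs must be strictly positive
        return v @ h                # linear combination at the output

    x = np.array([2.0, 3.0])
    W = np.array([[2.0, -1.0],
                  [0.5,  1.0]])
    v = np.array([1.0, 0.25])
    print(log_nn_forward(x, W, v))  # 4/3 + 0.25 * sqrt(2) * 3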



Publication year: 2001